Least-Squares Regression on Sparse Spaces
Authors
Abstract
Another application arises when one uses random projections to map each input vector into a lower-dimensional space and then trains a predictor in the compressed space (compression on the feature space). As is typical of dimensionality-reduction techniques, this reduces the variance of most predictors at the expense of introducing some bias. Random projections on the feature space, combined with least-squares predictors, are studied in [2], where the method is shown to reduce the estimation error at the price of a controlled approximation error. The analysis in [2] provides on-sample error bounds and extends them to bounds under the sampling measure, assuming an i.i.d. sampling strategy.
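A minimal sketch of this compressed least-squares setup, assuming a Gaussian random projection followed by ordinary least squares in the projected space (the dimensions and data below are illustrative, not taken from [2]):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: n samples in a D-dimensional feature space.
n, D, d = 500, 1000, 50          # d << D is the compressed dimension
X = rng.standard_normal((n, D))
w_true = rng.standard_normal(D)
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Random projection of the feature space: each input x is mapped to A x,
# with A a d x D matrix of i.i.d. N(0, 1/d) entries.
A = rng.standard_normal((d, D)) / np.sqrt(d)
Z = X @ A.T                      # compressed design matrix (n x d)

# Ordinary least squares in the compressed space.
w_hat, *_ = np.linalg.lstsq(Z, y, rcond=None)

# Predict on new inputs by projecting them the same way.
X_new = rng.standard_normal((10, D))
y_pred = (X_new @ A.T) @ w_hat
```

Fitting in the d-dimensional projected space rather than the original D-dimensional one is what trades a controlled approximation error (from the projection) for a smaller estimation error, as the excerpt describes.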
Similar Resources
Least Squares Support Vector Machines and Primal Space Estimation
In this paper, a methodology for estimation in kernel-induced feature spaces is presented, making a link between the primal-dual formulation of Least Squares Support Vector Machines (LS-SVM) and classical statistical inference techniques in order to perform linear regression in the primal space. This is done by computing a finite-dimensional approximation of the kernel-induced feature space mapping ...
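The excerpt does not say which finite-dimensional approximation is used; a common choice for approximating a kernel-induced feature map is the Nyström method, so the sketch below pairs Nyström features with a regularised least-squares (LS-SVM-style) fit in the primal space, purely as an illustration of the idea:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative 1-D regression data.
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(300)

def rbf(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Nystrom approximation: m landmark points give an explicit,
# finite-dimensional approximation of the kernel feature map.
m = 30
landmarks = X[rng.choice(len(X), size=m, replace=False)]
K_mm = rbf(landmarks, landmarks) + 1e-8 * np.eye(m)  # jitter for stability
U, S, _ = np.linalg.svd(K_mm)
T = U / np.sqrt(S)   # whitening map: Phi = K_nm @ T gives Phi Phi^T ~ K_nm K_mm^-1 K_mn

def features(A):
    """Approximate primal feature map phi(x)."""
    return rbf(A, landmarks) @ T

# Regularised least squares in the primal feature space, mirroring
# the LS-SVM objective.
Phi = features(X)
gamma = 1e-3
w = np.linalg.solve(Phi.T @ Phi + gamma * np.eye(m), Phi.T @ y)
y_hat = features(X) @ w
```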
Robust Estimation in Linear Regression with Multicollinearity and Sparse Models
One of the factors affecting the statistical analysis of data is the presence of outliers. Methods that are not unduly influenced by outliers are called robust methods; robust regression methods estimate the parameters of a regression model reliably in the presence of outliers. Besides outliers, the linear dependency of regressor variables, which is called multicollinearity ...
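As a small illustration of the robustness idea (the excerpt does not specify the paper's own estimator), scikit-learn's HuberRegressor can stand in for a robust method; its coefficient estimates are far less distorted by contaminated responses than ordinary least squares:

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(2)

# Data with a few gross outliers in the response.
X = rng.standard_normal((200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(200)
y[:10] += 15.0                      # contaminate 5% of the responses

ols = LinearRegression().fit(X, y)
huber = HuberRegressor().fit(X, y)  # downweights large residuals

print("OLS coefficients:  ", ols.coef_)
print("Huber coefficients:", huber.coef_)
```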
Sparse nonlinear discriminants
This thesis considers training algorithms for machine learning and their applications to classification, regression, and automatic speech recognition. In particular, supervised learning, which is also called learning from samples, is considered. Starting with a short introduction to statistical learning theory, it is shown that supervised learning can be formulated as a function estimation problem ...
Sparse partial least squares regression for simultaneous dimension reduction and variable selection
Partial least squares regression has been an alternative to ordinary least squares for handling multicollinearity in several areas of scientific research since the 1960s. It has recently gained much attention in the analysis of high-dimensional genomic data. We show that the known asymptotic consistency of the partial least squares estimator for a univariate response does not hold with the very large ...
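To make the dimension-reduction role of PLS under multicollinearity concrete, here is a sketch using scikit-learn's PLSRegression; note this is plain PLS, not the sparse variant the paper proposes, and the collinear data below is synthetic:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)

# Highly collinear design: many predictors driven by few latent factors.
n, p, k = 100, 50, 3
latent = rng.standard_normal((n, k))
X = latent @ rng.standard_normal((k, p)) + 0.01 * rng.standard_normal((n, p))
y = latent @ np.array([1.0, -1.0, 2.0]) + 0.1 * rng.standard_normal(n)

# PLS compresses X to a few components chosen for their covariance
# with y, then regresses y on those components.
pls = PLSRegression(n_components=k).fit(X, y)
y_hat = pls.predict(X).ravel()
print("R^2:", pls.score(X, y))
```

The sparse extension studied in the paper additionally zeroes out predictor loadings, performing variable selection alongside this dimension reduction.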
Multi-output regression using a locally regularised orthogonal least-squares algorithm (IEE Proceedings - Vision, Image and Signal Processing)
The paper considers data modelling using multi-output regression models. A locally regularised orthogonal least-squares (LROLS) algorithm is proposed for constructing sparse multi-output regression models that generalise well. By associating each regressor in the regression model with an individual regularisation parameter, the ability of the multi-output orthogonal least-squares (OLS) model selection ...
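The locally regularised machinery (one regularisation parameter per regressor) is beyond this excerpt; as a baseline, here is a sketch of plain single-output orthogonal least-squares forward selection, the greedy procedure that LROLS regularises. The function name and error-reduction criterion are the standard textbook form, not taken from the paper:

```python
import numpy as np

def ols_forward_select(P, y, n_terms):
    """Greedy orthogonal least squares: pick regressors (columns of P)
    one at a time by their error-reduction ratio, orthogonalising each
    candidate against the terms already selected."""
    n, M = P.shape
    selected, Q = [], []              # chosen indices and orthogonal basis
    for _ in range(n_terms):
        best, best_err, best_q = None, -np.inf, None
        for j in range(M):
            if j in selected:
                continue
            q = P[:, j].copy()
            for qi in Q:              # Gram-Schmidt against chosen terms
                q -= (qi @ q) / (qi @ qi) * qi
            if q @ q < 1e-12:         # candidate is linearly dependent
                continue
            err = (q @ y) ** 2 / (q @ q)   # error reduction of this term
            if err > best_err:
                best, best_err, best_q = j, err, q
        selected.append(best)
        Q.append(best_q)
    # Least-squares fit restricted to the selected regressors.
    w, *_ = np.linalg.lstsq(P[:, selected], y, rcond=None)
    return selected, w

# Illustrative use: 20 candidate regressors, 3 of them truly active.
rng = np.random.default_rng(4)
P = rng.standard_normal((200, 20))
y = 2 * P[:, 3] - P[:, 7] + 0.5 * P[:, 12] + 0.1 * rng.standard_normal(200)
idx, w = ols_forward_select(P, y, n_terms=3)
print(idx, w)
```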